Web Survey Bibliography
Title: Response Rates and Response Bias in Web Panel Surveys
Year: 2015
Access date: 22.08.2016
Abstract
Non-probability samples, such as online panels, are increasingly accepted as “fit for purpose” for low incidence populations (e.g., pregnant women), difficult to reach populations (e.g., health care workers) and other special populations, particularly when time or cost make probability surveys infeasible. However, there is much less enthusiasm for the application of these methods in social science research for general populations. Aside from the issue of statistical generalizability, low response rates within the panel and demographic biases in the achieved samples are often cited (AAPOR 2010).
Are low response rates and demographic biases endemic to population surveys using web panels, or do they reflect the methods of particular surveys? Many web panel surveys are conducted in such a way that the response rate cannot be calculated. In other cases, the response rate is not reported. Further, most web surveys are not designed to optimize response rate, since sample is nearly unlimited and speed is often critically important to the client. In addition, biases in web surveys are usually identified by comparing the characteristics of the achieved sample to the population, which does not identify whether the source of the error is the frame or the survey procedures.
This paper examines the application of two survey protocols in a general population survey conducted in the same community using a national web panel. Invitations will be sent to two Census-balanced samples of 5,000 from the master panel, with the goal of achieving at least 500 completes in each sample. For the first protocol, invitations will be followed by a single reminder, an industry standard. For the second protocol, a robust reminder schedule including up to four reminders will be fielded over a three-week period. Response rate is calculated as the proportion of invited respondents who complete the interview. Non-response bias is calculated by comparing the characteristics of responders and non-responders from their panel profiles. Findings are compared across the two samples from the same community in the experiment.
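The two quantities the abstract defines can be sketched in a few lines. The figures below are hypothetical (chosen only to match the study's stated design of 5,000 invitations per sample and a 500-complete target); the demographic shares are illustrative assumptions, not results from the paper.

```python
def response_rate(completes: int, invited: int) -> float:
    """Proportion of invited panelists who complete the interview."""
    return completes / invited

def nonresponse_bias(responder_share: float, nonresponder_share: float) -> float:
    """Gap in a panel-profile characteristic (e.g., share female)
    between responders and non-responders."""
    return responder_share - nonresponder_share

# Hypothetical: 500 completes from 5,000 invitations -> 10% response rate.
rr = response_rate(500, 5000)

# Hypothetical profile comparison: 54% female among responders
# vs. 51% among non-responders -> +3 percentage points of bias.
bias = nonresponse_bias(0.54, 0.51)
```

Comparing responders to non-responders on panel-profile variables, rather than comparing the achieved sample to the population, is what lets the design separate nonresponse error from frame error.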
Access/Direct link: FCSM Research Conference Homepage (Abstract) / (Full text)
Year of publication: 2015
Bibliographic type: Conferences, workshops, tutorials, presentations
Web survey bibliography - 2015 (291)
- Taking MARS Digital; 2015; Melton, E.; Krahn, J.
- A Comparison of the Effects of Face-to-Face and Online Deliberation on Young Students’ Attitudes...; 2015; Triantafillidou, A.; Yannas, P.; Lappas, G.; Kleftodimos, A.
- A Privacy-Friendly Method to Reward Participants of Online-Surveys; 2015; Herfert, M.; Lange, B.; Selzer, A.; Waldmann, U.
- Doing Online Surveys: Zum Einsatz in der sozialwissenschaftlichen Raumforschung; 2015; Nadler, R.; Petzold, K.; Schoenduwe, R.
- Are Fast Responses More Random? Testing the Effect of Response Time on Scale in an Online Choice Experiment...; 2015; Boerger, T.
- The impact of frequency rating scale formats on the measurement of latent variables in web surveys -...; 2015; Menold, N.; Kemper, C. J.
- Investigating response order effects in web surveys using eye tracking; 2015; Karem Hoehne, J.; Lenzner, T.
- Implementation of the forced answering option within online surveys: Do higher item response rates come...; 2015; Decieux, J. P.; Mergener, A.; Neufang, K.; Sischka, P.
- Internet Panels, Professional Respondents, and Data Quality; 2015; Matthijsse, S.; De Leeuw, E. D.; Hox, J.
- Self-administered Questions and Interviewer–Respondent Familiarity; 2015; Rodriguez, L. A.; Sana, M.; Sisk, B.
- Comparing Food Label Experiments Using Samples from Web Panels versus Mall Intercepts; 2015; Chang, L. C.; Lin, C. T. J.
- Translating Answers to Open-ended Survey Questions in Cross-cultural Research: A Case Study on the Interplay...; 2015; Behr, D.
- The impact of gamifying to increase spontaneous awareness; 2015; Cape, P.
- Using eye-tracking to understand how fourth grade students answer matrix items; 2015; Maitland, A.; Sun, H.; Caporaso, A.; Tourangeau, R.; Bertling, J.; Almonte, D.
- Incentive Types and Amounts in a Web-based Survey of College Students; 2015; Krebs, C.; Planty, M.; Stroop, J.; Berzofsky, M.; Lindquist, C.
- Response Rates and Response Bias in Web Panel Surveys; 2015; Boyle, J.; Berman, L.; Dayton, Ja.; Fakhouri, T.; Iachan, R.; Courtright, M.; Pashupati, K.
- Characteristics of the Population of Internet Panel Members; 2015; Boyle, J; Freedner, N.; Fakhouri, T.
- Internet and Smartphone Coverage in a National Health Survey: Implications for Alternative Modes; 2015; Couper, M. P.; Kelley, J.; Axinn, W.; Guyer, H.; Wagner, J.; West, B. T.
- An Overview of Mobile CATI Issues in Europe; 2015; Slavec, A.; Toninelli, D.
- Using Mobile Phones for High-Frequency Data Collection; 2015; Azevedo, J. P.; Ballivian, A.; Durbin, W.
- Willingness of Online Access Panel Members to Participate in Smartphone Application-Based Research; 2015; Pinter, R.
- Who Has Access to Mobile Devices in an Online Opt-in Panel? An Analysis of Potential Respondents for...; 2015; Revilla, M.; Toninelli, D.; Ochoa, C.; Loewe, G.
- Who Are the Internet Users, Mobile Internet Users, and Mobile-Mostly Internet Users?: Demographic Differences...; 2015; Antoun, C.
- A Meta-Analysis of Breakoff Rates in Mobile Web Surveys; 2015; Mavletova, A. M.; Couper, M. P.
- The Best of Both Worlds? Combining Passive Data with Survey Data, its Opportunities, Challenges and...; 2015; Duivenvoorde, S.; Dillon, A.
- Optimizing the Decennial Census for Mobile – A Case Study; 2015; Nichols, E. M.; Hawala, E. O.; Horwitz, R.; Bentley, M.
- App vs. Web for Surveys of Smartphone Users: Experimenting with mobile apps for signal-contingent experience...; 2015; McGeeney, K.; Keeter, S.; Igielnik, R.; Smith, A.; Rainie, L.
- Using Video to Reinvigorate the Open Question; 2015; Cape, P.
- On the Go: How Mobile Participants Affect Survey Results; 2015; Barlas, F. M.; Thomas, R. K.
- The Matrix Lives On: Improving Grids for Online Surveys; 2015; Thomas, R. K.; Barlas, F. M.; Graham, P.; Subias, T.
- Variance Estimation for Surveys from Internet Panels ; 2015; Rivers, D.
- Sensitivity Analysis of Bias of Estimates from Web Surveys with Nonrandomized Panel Selection; 2015; Beresovsky, V.
- Detecting Fraud in a Survey Sample Recruited Online; 2015; Brown, D.; Dever, J. A.; Augustson, E.; Squiers, L.
- Survey Treatments and Response Modes: Bayesian Survival Analysis with Competing Risks; 2015; Minato, H.
- Purposefully Mobile: Experimentally Assessing Device Effects in an Online Survey ; 2015; Barlas, F. M.; Thomas, R. K.; Graham, P.
- Using equivalence testing to disentangle selection and measurement in mixed modes surveys ; 2015; Cernat, A.
- What do web survey panel respondents answer when asked “Do you have any other comment?”; 2015; Schonlau, M.
- On Climbing Stairs Many Steps at a Time: The New Normal in Survey Methodology; 2015; Dillman, D. A.
- Mobile Research Methods: Opportunities and challenges of mobile research methodologies. ; 2015; Toninelli, D. (Ed.); Pinter, R.; de Pedraza, P.
- Effect of Web-Based Versus Paper-Based Questionnaires and Follow-Up Strategies on Participation Rates...; 2015; Kilsdonk, E.; van den Heuvel-Eibrink, M. M.; van Dulmen-den Broeder, E.; van der Pal, H. J. H.; van...
- Polling Error in the 2015 UK General Election: An Analysis of YouGov’s Pre and Post-Election Polls...; 2015; Wells, A.; Rivers, D.
- Cell Phone and Face-to-face Interview Responses in Population-based Surveys - How Do They Compare?; 2015; Ghandour, L.; Ghandour, B.; Mahfoud, Z.; Mokdad, A.; Sibai, A. M.
- Collecting Health Research Data - Comparing Mobile Phone-assisted Personal Interviewing to Paper-and...; 2015; van Heerden, A. C.; Norris, S. A.; Tollman, S. M.; Richter, L. M.
- The Effects of Questionnaire Completion Using Mobile Devices on Data Quality. Evidence from a Probability...; 2015; Bosnjak, M.; Struminskaya, B.; Weyandt, K.
- Are Sliders Too Slick for Surveys? An Experiment Comparing Slider and Radio Button Scales for Smartphone...; 2015; Aadland, D.; Aalberg, T.
- Evaluation of an Adapted Design in a Multi-device Online Panel: A DemoSCOPE Case Study; 2015; Arn, B.; Klug, S.; Kolodziejski, J.
- Maximizing Data Quality using Mode Switching in Mixed-Device Survey Design: Nonresponse Bias and Models...; 2015; Axinn, W.; Gatny, H. H.; Wagner, J.
- Web Surveys Optimized for Smartphones: Are there Differences Between Computer and Smartphone Users?; 2015; Andreadis, I.
- Measuring Political Knowledge in Web-Based Surveys: An Experimental Validation of Visual Versus Verbal...; 2015; Munzert, S.; Selb, P.
- Validation of the new scale for measuring behaviors of Facebook users: Psycho-Social Aspects of Facebook...; 2015; Bodroza, B.; Jovanovic, T.